List of Flash News about high bandwidth memory
| Time | Details |
|---|---|
| 2026-01-26 16:01 | Azure Maia 200 AI Accelerator Now Live: 30% Better Performance per Dollar, Over 10 PFLOPS FP4 and 216GB HBM3e<br>According to @satyanadella, Microsoft's Maia 200 AI accelerator is now online in Azure, designed for industry-leading inference efficiency with a stated 30% better performance per dollar than current systems. The accelerator delivers over 10 PFLOPS FP4, about 5 PFLOPS FP8, and 216GB of HBM3e with 7 TB/s of memory bandwidth for high-throughput inference on Azure. The release targets cost-efficient inference at scale within Azure's AI infrastructure (source: @satyanadella). |